
Adding Active Directory Timeline module #923

Merged — 4 commits merged into EricZimmerman:master on Apr 5, 2024

Conversation

cert-cwatch
Contributor

Description

ADTimeline is a PowerShell script created by ANSSI (the French Cybersecurity Agency).
It's a powerful live DFIR script for analyzing a potentially compromised Active Directory environment.

Checklist:

Please replace every instance of [ ] with [X] OR click on the checkboxes after you submit your PR

  • I have generated a unique GUID for my Target(s)/Module(s)
  • I have placed the Target(s)/Module(s) in an appropriate subfolder in Targets or Modules. If one doesn't exist, I have either added it to the Misc folder or created a relevant subfolder with justification
  • I have set or updated the version of my Target(s)/Module(s)
  • I have verified that KAPE parses the Target(s)/Module(s) successfully via kape.exe, using --tlist/--mlist and corrected any errors
  • I have validated my Target(s)/Module(s) against test data and verified they are working as intended
  • I have made an attempt to document the artifacts within the Target(s) or Module(s) I am submitting. If documentation doesn't exist, I have placed N/A underneath the Documentation header
  • For Targets, I have consulted either the Target Guide, Target Template, Compound Target Guide, or Compound Target Template to ensure my Target(s) follow the same format
  • For Modules, I have consulted either the Module Guide, Module Template, Compound Module Guide, or Compound Module Template to ensure my Module(s) follow the same format

If your submission involves an SQLite database, have you considered making an SQLECmd Map for the SQLite database? If you make a Map, please add the SQLite database to the SQLiteDatabases.tkape Compound Target.

Thank you for your submission and for contributing to the DFIR community!

@AndrewRathbun AndrewRathbun self-assigned this Apr 2, 2024
@AndrewRathbun AndrewRathbun self-requested a review April 2, 2024 18:40

@Champipote Champipote left a comment


Review OK. Some corrections needed for PowerShell, and some stray spaces.

@AndrewRathbun
Collaborator

Review OK. Some corrections needed for PowerShell, and some stray spaces.

Can you review my comments and address those before I merge? I don't think the Module will work in its current state.

@cert-cwatch
Contributor Author

Hey @AndrewRathbun, thanks for the review.
I do not see any comments in the commit. Could you please tell us where they are located?

We have tested this module on two different test domain controllers and it works perfectly:

(screenshot attached)

2024-04-03T12_57_55_8488497_ConsoleLog.txt

@AndrewRathbun
Collaborator

AndrewRathbun commented Apr 3, 2024

We have tested this module on two different test domain controllers and it works perfectly:

(screenshot attached)

2024-04-03T12_57_55_8488497_ConsoleLog.txt

It may "work" but the way it's currently structured isn't the cleanest.

https://github.com/EricZimmerman/KapeFiles/blob/master/Modules%2FEZTools%2FLECmd.mkape

Look at this one for example. Line 7 specifies which format KAPE should default to if not specified otherwise. Then, the separate formats are each in their own processors, and not listed all on the same line. Right now, it may work but we should clean this up before merging. Hopefully the LECmd example helps!
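For illustration, here is a minimal sketch of the layout being described: a default ExportFormat at the Module level, then one processor per output format. The GUID, tool name, and command-line flags below are placeholders, not the actual LECmd.mkape contents.

```yaml
# Hypothetical .mkape sketch; all names and flags are placeholders.
Description: Example Module with one processor per output format
Category: ActiveDirectory
Author: Example Author
Version: 1.0
Id: 00000000-0000-0000-0000-000000000000
ExportFormat: csv          # the format KAPE defaults to if none is specified
Processors:
    -
        Executable: ExampleTool.exe
        CommandLine: -d %sourceDirectory% --csv %destinationDirectory%
        ExportFormat: csv   # one format per processor, never comma-separated
    -
        Executable: ExampleTool.exe
        CommandLine: -d %sourceDirectory% --json %destinationDirectory%
        ExportFormat: json
```

The point of the split is that each processor's CommandLine produces exactly the format its ExportFormat declares, so KAPE can pick the right processor for the format the user requests.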

@EricZimmerman
Owner

If there's a single processor that's the only one that will ever be used, so you don't even need an export format.

@AndrewRathbun
Collaborator

Hey @AndrewRathbun, thanks for the review.
I do not see any comments in the commit. Could you please tell us where they are located?

Go to the Files Changed tab, here's one of them:

#923 (comment)

@AndrewRathbun
Collaborator

If there's a single processor that's the only one that will ever be used, so you don't even need an export format.

Or, if you want a processor for each of those 3 formats, they need to be separated out. Currently, in your command, I see flags for XML, so that should be your XML processor. If you want the other output formats, modify the command to reflect that and make a separate processor for each 👍

@b1draper

b1draper commented Apr 3, 2024

Hey guys, I've used ADTimeline a lot "offline manually" in several IRs. Having to collect the NTDS folder and process it offline isn't always fun: the server running dsamain must have an OS version that matches the ntds.dit being processed. I've talked with the developers a little in the past because the requirements for offline processing weren't mentioned.

In an offline situation I've gotten to the point where it'll run nearly 100% of the time, and when it doesn't, a little tweaking will get it running. The biggest hurdle is getting dsamain to successfully mount a dirty .dit file. The Splunk app they offer does an awesome job visualizing the data. Getting customers to run the tools live on a DC, as supplied by the developer, has been an issue due to the learning curve. Looking forward to seeing this work on some of my response engagements.

@AndrewRathbun
Collaborator

Hey guys, I've used the ADTimeline a lot "offline manually" in several IRs. [...]

In my understanding, NTDS.dit is simply an ESE database, so much like SRUM/SUM/the Windows Search Index, etc., the database will be dirty when it's acquired live. It'll need to be repaired by an OS version equal to or newer than the system it was acquired from, due to the version of the Jet engine used to create it initially. An easy test: try to repair an ESE DB acquired from W10 on W11 and watch it work; do the opposite and it won't.

esentutl.exe /mh <file> should show you the engine versions in question for the database being repaired and the version of the engine from your machine.

https://www.stellarinfo.com/article/check-exchange-database-state-using-eseutil-mh-command.php
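As a quick illustration of that check, a header dump might look like the following. This is an illustrative invocation only; the ntds.dit path is the usual default, and the exact output field names can vary by Windows version.

```
REM Dump the ESE database header; esentutl ships with Windows.
REM Look for "State: Dirty Shutdown" vs "State: Clean Shutdown",
REM and compare the engine/format version fields in the header
REM against the Jet engine version on the repairing machine.
esentutl.exe /mh C:\Windows\NTDS\ntds.dit
```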

@b1draper

b1draper commented Apr 3, 2024

@AndrewRathbun Are there any plans to work with an offline collection of the NTDS folder?

@AndrewRathbun
Collaborator

@AndrewRathbun Are there any plans to work with an offline collection of the NTDS folder?

https://github.com/EricZimmerman/KapeFiles/blob/master/Targets%2FWindows%2FActiveDirectoryNTDS.tkape

Make CSV the default ExportFormat, and rename the ExportFormat for the first processor to CSV
Collaborator

@AndrewRathbun AndrewRathbun left a comment


I took the liberty of making a change to the Module so we can merge it. The ExportFormat(s) were changed to simply CSV, instead of multiple comma-separated values. The command can have all the outputs it wants, but the ExportFormat should be singular: no other Module has multiple values on the same ExportFormat line; instead, they are separated out into multiple processors, one for each format. However, I don't think this Module is intended to work like that, so I'm simplifying it by just calling it CSV and moving on. If there are any issues with that, open another PR to fix it, but this keeps the format clean and in line with every other Module.

@AndrewRathbun AndrewRathbun merged commit d20ab7b into EricZimmerman:master Apr 5, 2024
1 check passed
@AndrewRathbun
Collaborator

Also, I didn't catch this the first time, but GitHub was spelled GGitHub, and therefore made an erroneous folder. I fixed it here: 6ffa71d

@cert-cwatch
Contributor Author

Nice, thanks for your help @AndrewRathbun!
